Senior Big Data Developer - $130-150k - Hybrid - Phoenix, AZ; New York, NY; or Fort Lauderdale, FL.
Role Overview: We are seeking a skilled Senior Big Data Developer to join our team. The ideal candidate will bring extensive experience in big data processing and analysis, including at least 3 years of hands-on expertise with Apache Spark. You will design, implement, and optimize data pipelines and applications that handle large-scale data in both real-time and batch processing environments.
Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain large-scale, distributed data pipelines using Apache Spark.
- Performance Optimization: Implement best practices for optimizing Spark jobs for performance and scalability.
- Integration: Work with diverse data sources, including HDFS, NoSQL databases, relational databases, and cloud storage.
- Collaboration: Partner with data scientists, analysts, and stakeholders to understand requirements and deliver data solutions.
- Real-time Processing: Develop real-time data streaming applications using Spark Streaming or similar technologies.
- Code Quality: Write clean, maintainable, and reusable code, following best practices for version control, testing, and documentation.
- Troubleshooting: Identify, diagnose, and resolve issues in Spark jobs and data pipelines.
Required Skills and Qualifications:
Experience:
- 8 to 12 years in software development, with a focus on big data solutions.
- 3+ years of hands-on experience in Apache Spark (batch and streaming).
Technical Skills:
- Proficient in programming languages such as Scala, Python, or Java.
- Strong understanding of distributed computing principles.
- Experience with big data ecosystems (Hadoop, HDFS, Hive, Kafka).
- Familiarity with cloud platforms like AWS, Azure, or GCP.
- Proficient in SQL and experience with relational databases.
Tools and Frameworks:
- Version control with Git and hosted platforms such as GitHub or Bitbucket.
- CI/CD pipelines and DevOps practices.
Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork abilities.
Other Qualifications (Preferred):
- Knowledge of machine learning frameworks and integration with Spark (e.g., MLlib).
- Experience with containerization and orchestration tools (Docker, Kubernetes).
- Certifications in Big Data or Cloud technologies.
- Experience in developing solutions for Payment Networks and Payment or Financial transaction processing.